1. Hilbert Space and Inner Product, Operator Theory

This document covers key topics ranging from the fundamental concepts of Hilbert spaces to inner products, orthogonality, the important operator classes (Hermitian, unitary, projector), the spectral theorem, and the trace. Each concept is explained with both abstract definitions and concrete examples to aid understanding.


1. Fundamental Concepts

Hilbert Space
A Hilbert space is a vector space equipped with an inner product that is complete with respect to the norm induced by that inner product. This concept naturally extends the familiar Euclidean space to infinite dimensions and forms the mathematical foundation of functional analysis, Fourier analysis, and quantum mechanics.

Detailed Explanation: What is completeness?
Completeness means that every Cauchy sequence within the space converges to a point within the same space. Intuitively, if the terms of a sequence get closer and closer to each other, then the destination they are heading toward must also exist within the space.
For example, the set of rational numbers \(\mathbb{Q}\) is not complete. Consider a rational sequence converging to \(\pi\) (e.g., 3, 3.1, 3.14, …). The terms get closer and closer, but the limit value \(\pi\) is not a rational number, so it does not exist within \(\mathbb{Q}\). In contrast, the set of real numbers \(\mathbb{R}\) is complete, so this issue does not occur. The completeness of a Hilbert space ensures the ability to freely use limit operations such as differentiation and integration.
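
As a small illustration (a sketch that is not part of the original argument, assuming Python with the standard fractions module): the decimal truncations of \(\pi\) are all rational and form a Cauchy sequence, which is exactly the situation described above. The code only exhibits the shrinking gaps; that the limit \(\pi\) is irrational is, of course, a separate mathematical fact.

```python
from fractions import Fraction
import math

# Rational truncations of pi: 3, 31/10, 314/100, ... (every term lies in Q)
terms = [Fraction(int(math.pi * 10**k), 10**k) for k in range(8)]

# Successive gaps shrink below any tolerance: the sequence is Cauchy in Q,
# yet its limit (pi) is not rational, so Q is not complete.
for n in range(len(terms) - 1):
    gap = abs(terms[n + 1] - terms[n])
    print(f"a_{n} = {terms[n]},  gap to next term = {float(gap):.1e}")
```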

Inner Product
An inner product is an operation that maps two vectors to a scalar, assigning geometric concepts such as length, distance, and angle (orthogonality) to a vector space. An inner product must satisfy the following three properties:
* Conjugate Symmetry: \(\langle x,y \rangle = \overline{\langle y,x \rangle}\)
* Linearity in the second argument: \(\langle z, ax+by \rangle = a\langle z,x \rangle + b\langle z,y \rangle\) (the physics convention used throughout this document; the inner product is then conjugate-linear in the first argument)
* Positive-definiteness: \(\langle x,x \rangle \ge 0\) and \(\langle x,x \rangle=0 \iff x=0\)
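
These three axioms, in the convention above (linear in the second argument, conjugate-linear in the first), can be spot-checked numerically. A minimal sketch assuming Python with NumPy; note that np.vdot conjugates its first argument, matching this convention.

```python
import numpy as np

rng = np.random.default_rng(0)
x, y, z = (rng.normal(size=3) + 1j * rng.normal(size=3) for _ in range(3))
a, b = 2 - 1j, 0.5 + 3j

inner = np.vdot  # <u, v> = sum(conj(u_i) * v_i): conjugate-linear in the first slot

# Conjugate symmetry: <x, y> = conj(<y, x>)
assert np.isclose(inner(x, y), np.conj(inner(y, x)))
# Linearity in the second argument: <z, a x + b y> = a<z, x> + b<z, y>
assert np.isclose(inner(z, a * x + b * y), a * inner(z, x) + b * inner(z, y))
# Positive definiteness: <x, x> is real and positive for x != 0
assert inner(x, x).real > 0 and np.isclose(inner(x, x).imag, 0.0)
```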

Hermitian Operator
A self-adjoint operator satisfying \(A=A^\dagger\). In quantum mechanics, it represents measurable physical quantities (position, momentum, energy, etc.) and always has real eigenvalues, with eigenvectors corresponding to different eigenvalues being orthogonal.
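
Both stated properties (real eigenvalues, orthogonality of eigenvectors belonging to different eigenvalues) can be verified numerically for a randomly generated Hermitian matrix; a minimal sketch assuming NumPy.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = (M + M.conj().T) / 2        # (M + M^dagger)/2 is Hermitian by construction

vals, vecs = np.linalg.eigh(A)  # eigensolver specialized for Hermitian matrices
print(vals)                     # eigenvalues are returned as real numbers

# The columns of vecs are eigenvectors; their Gram matrix is the identity,
# so eigenvectors for different eigenvalues are orthogonal (and normalized).
assert np.allclose(vecs.conj().T @ vecs, np.eye(4))
```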

Unitary Operator
An operator satisfying \(U^\dagger U=UU^\dagger=\mathbf{1}\), which is an isometry preserving the inner product and norm (length) of vectors. This corresponds to transformations that maintain the geometric structure, such as rotations or reflections in a vector space. In quantum mechanics, it describes the time evolution of a system or basis transformations.
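
A minimal numerical sketch (assuming NumPy): the Q factor of a QR decomposition of a random complex matrix is unitary, and it indeed preserves inner products and norms.

```python
import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))
assert np.allclose(U.conj().T @ U, np.eye(3))        # U^dagger U = 1

phi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)

# Isometry: <U phi, U psi> = <phi, psi> and ||U psi|| = ||psi||
assert np.isclose(np.vdot(U @ phi, U @ psi), np.vdot(phi, psi))
assert np.isclose(np.linalg.norm(U @ psi), np.linalg.norm(psi))
```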

Projector
An idempotent linear operator satisfying \(P^2=P\). If the condition \(P=P^\dagger\) is also satisfied, it is called an orthogonal projector, representing the shortest-distance projection of a vector onto a specific subspace.

Spectral Theorem
In finite-dimensional Hilbert spaces, every Hermitian operator can be decomposed into the form \(A=\sum_{j} \lambda_j P_j\) using its eigenvalues and the orthogonal projectors onto its eigenspaces. This is analogous to diagonalizing a matrix as \(A=U^\dagger DU\). The theorem allows a complete understanding of an operator’s essential structure through its spectrum (the set of eigenvalues).
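
The decomposition \(A=\sum_j \lambda_j P_j\) can be built directly from an eigendecomposition by grouping eigenvectors that share an eigenvalue into projectors. A minimal sketch assuming NumPy, applied to the Hermitian matrix that reappears in the Examples section; rounding eigenvalues to 10 decimals is an arbitrary way of grouping numerically equal ones.

```python
import numpy as np

A = np.array([[2, 1 + 1j], [1 - 1j, 0]])   # Hermitian matrix from the Examples section
vals, vecs = np.linalg.eigh(A)

# Group equal eigenvalues and accumulate |v><v| into the projector of each eigenspace.
projectors = {}
for lam, v in zip(np.round(vals, 10), vecs.T):
    projectors[lam] = projectors.get(lam, 0) + np.outer(v, v.conj())

# Check P_i P_j = delta_ij P_j, sum_j P_j = 1, and A = sum_j lambda_j P_j.
lams = list(projectors)
for i in lams:
    for j in lams:
        expected = projectors[j] if i == j else np.zeros_like(A)
        assert np.allclose(projectors[i] @ projectors[j], expected)
assert np.allclose(sum(projectors.values()), np.eye(2))
assert np.allclose(sum(l * P for l, P in projectors.items()), A)
```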

Trace
Defined as the sum of a matrix’s diagonal elements (\(\mathrm{Tr}(A) = \sum_i A_{ii}\)), it is an invariant independent of the choice of basis. An important property is cyclicity (\(\mathrm{Tr}(AB) = \mathrm{Tr}(BA)\)); in quantum mechanics, the trace of the product of a density matrix and an observable is used to compute the expectation value of that observable.
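
Linearity, cyclicity, and basis invariance of the trace are easy to check on random matrices; a minimal sketch assuming NumPy.

```python
import numpy as np

rng = np.random.default_rng(3)
A, B = rng.normal(size=(2, 3, 3)) + 1j * rng.normal(size=(2, 3, 3))
U, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))

assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))   # linearity
assert np.isclose(np.trace(A @ B), np.trace(B @ A))             # cyclicity
assert np.isclose(np.trace(U.conj().T @ A @ U), np.trace(A))    # basis invariance
```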


2. Symbols and Key Relations

  • Inner Product and Norm:
    • \(\langle x,y \rangle = \overline{\langle y,x \rangle}\) (Conjugate symmetry)
    • \(\langle z, ax+by \rangle = a\langle z,x \rangle + b\langle z,y \rangle\) (Linearity in the second argument; conjugate-linear in the first)
    • \(\|x\| = \sqrt{\langle x,x \rangle}\) (Norm derived from the inner product)
    • \(x \perp y \iff \langle x,y \rangle = 0\) (Orthogonality condition)
  • Adjoint Operator and Important Operator Classes:
    • \(\langle Ax,y \rangle = \langle x, A^\dagger y \rangle\) (Definition of the adjoint operator \(A^\dagger\))
    • Hermitian: \(A = A^\dagger\)
    • Unitary: \(U^\dagger U=UU^\dagger=\mathbf{1}\)
    • Orthogonal Projector: \(P^2=P=P^\dagger\)
  • Orthonormal Basis and Completeness:
    • Given an orthonormal basis \(\{e_j\}\), any vector \(x\) can be expanded as \(x=\sum_j \langle e_j,x \rangle e_j\). Here, the coefficients \(\langle e_j,x \rangle\) represent the magnitude of the projection of \(x\) onto the direction of \(e_j\).
    • Parseval’s Identity: \(\|x\|^2 = \sum_j |\langle e_j,x \rangle|^2\). This states that the total squared norm of a vector equals the sum of the squared norms of its components in each basis direction, representing a generalization of the Pythagorean theorem to infinite dimensions.
  • Spectral Decomposition:
    • A finite-dimensional Hermitian operator \(A\) is decomposed as \(A = \sum_j \lambda_j P_j\).
    • Here, \(\lambda_j\) are the distinct eigenvalues of \(A\), and \(P_j\) are the orthogonal projectors onto the eigenspaces corresponding to \(\lambda_j\).
    • The projectors satisfy: \(P_i P_j = \delta_{ij} P_j\) (mutual orthogonality), \(\sum_j P_j = \mathbf{1}\) (completeness of the decomposition).
  • Properties of the Trace:
    • \(\mathrm{Tr}(A+B) = \mathrm{Tr}(A)+\mathrm{Tr}(B)\) (Linearity)
    • \(\mathrm{Tr}(AB) = \mathrm{Tr}(BA)\) (Cyclicity)
    • The trace is invariant under basis transformations: \(\mathrm{Tr}(U^\dagger AU) = \mathrm{Tr}(A)\)
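
The relations above that are not exercised by the earlier sketches, namely the definition of the adjoint, the expansion in an orthonormal basis, and Parseval's identity, can also be checked numerically. A minimal sketch assuming NumPy; the orthonormal basis is taken (as an arbitrary choice) from the columns of the Q factor of a QR decomposition.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4
x = rng.normal(size=n) + 1j * rng.normal(size=n)
y = rng.normal(size=n) + 1j * rng.normal(size=n)
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# Adjoint: <A x, y> = <x, A^dagger y>
assert np.isclose(np.vdot(A @ x, y), np.vdot(x, A.conj().T @ y))

# Random orthonormal basis {e_j}: the columns of a unitary matrix.
E, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
coeffs = np.array([np.vdot(E[:, j], x) for j in range(n)])   # <e_j, x>

# Expansion x = sum_j <e_j, x> e_j and Parseval ||x||^2 = sum_j |<e_j, x>|^2
assert np.allclose(E @ coeffs, x)
assert np.isclose(np.linalg.norm(x) ** 2, np.sum(np.abs(coeffs) ** 2))
```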

3. Examples

  • Hilbert Space · Inner Product Example 1: \(\mathbb{C}^2\)
    • \(x=(1, i)^\mathsf{T}\), \(y=(2, 0)^\mathsf{T}\)
    • \(\langle x,y \rangle = \overline{x_1}y_1 + \overline{x_2}y_2 = \overline{1}\cdot2 + \overline{i}\cdot0 = 1\cdot2 + (-i)\cdot0 = 2\)
    • \(\|x\| = \sqrt{\langle x,x \rangle} = \sqrt{\overline{1}\cdot1+\overline{i}\cdot i} = \sqrt{1+(-i)(i)} = \sqrt{1+1} = \sqrt{2}\)
    • Reference: Inner product of complex vectors. The reason for taking the conjugate of one vector when defining an inner product on a complex vector space is to ensure the positive definiteness of the norm (\(\|x\|^2 = \langle x,x \rangle \ge 0\)). Without the conjugate, \(x=(i,0)^{\mathsf T}\) would give \(\langle x,x \rangle = i^2 = -1\), making it impossible to define a length.
  • Hilbert Space · Inner Product Example 2: \(L^2[0,1]\)
    • The space of square-integrable functions on the interval \([0,1]\)
    • \(f(t)=1\), \(g(t)=t\)
    • \(\langle f,g \rangle = \int_0^1 \overline{f(t)}g(t)dt = \int_0^1 1 \cdot t dt = \left[\frac{1}{2}t^2\right]_0^1 = \frac{1}{2}\)
    • \(\|g\| = \sqrt{\int_0^1 |g(t)|^2 dt} = \sqrt{\int_0^1 t^2 dt} = \sqrt{\left[\frac{1}{3}t^3\right]_0^1} = \sqrt{\frac{1}{3}}\)
    • Detailed Explanation: Importance of the \(L^2\) space. \(L^p\) spaces are function spaces in which the size of a function is measured by \(\left(\int |f|^p\right)^{1/p}\). Among these, the \(L^2\) space with \(p=2\) is special because it is the only case where the norm can be derived from an inner product. Thanks to this, \(L^2\) is a Hilbert space, so powerful geometric tools such as the Pythagorean theorem, orthogonal decomposition, and Fourier series can be applied to functions. This is at the core of many applications, such as solving partial differential equations or analyzing waves in signal processing.
  • Orthogonal Basis · Completeness Example 1: Standard Basis of \(\mathbb{C}^3\)
    • \(e_1=(1,0,0)^\mathsf{T}\), \(e_2=(0,1,0)^\mathsf{T}\), \(e_3=(0,0,1)^\mathsf{T}\)
    • \(\langle e_i,e_j \rangle=\delta_{ij}\) (Kronecker delta, 1 when \(i=j\), 0 otherwise), so it is an orthonormal basis.
    • Any vector \(x=(x_1,x_2,x_3)^{\mathsf T}\) can be expressed as \(x=x_1e_1+x_2e_2+x_3e_3\). In this case, the coefficients \(x_j\) are equal to \(x_j = \langle e_j, x \rangle\).
  • Orthogonal Basis · Completeness Example 2: Fourier basis of \(L^2[0,2\pi]\)
    • The function set \(\{\frac{e^{int}}{\sqrt{2\pi}}\}_{n\in\mathbb{Z}}\) forms an orthonormal basis for the \(L^2[0,2\pi]\) space.
    • Any function \(f(t)\) can be expanded as a Fourier series \(f(t) = \sum_{n=-\infty}^{\infty} c_n \frac{e^{int}}{\sqrt{2\pi}}\).
    • The Fourier coefficients \(c_n\) are calculated as \(c_n = \langle \frac{e^{int}}{\sqrt{2\pi}}, f(t) \rangle = \frac{1}{\sqrt{2\pi}}\int_0^{2\pi} e^{-int}f(t)dt\), and the series converges to \(f\) in the \(L^2\)-norm sense.
  • Hermitian Example 1: Matrix \(A=\begin{pmatrix} 2 & 1+i \\ 1-i & 0 \end{pmatrix}\)
    • \(A^\dagger = \overline{A^{\mathsf T}} = \overline{\begin{pmatrix} 2 & 1-i \\ 1+i & 0 \end{pmatrix}} = \begin{pmatrix} 2 & 1+i \\ 1-i & 0 \end{pmatrix} = A\)
    • Therefore, \(A\) is a Hermitian matrix, and its eigenvalues are always real. (In fact, calculating yields \(3, -1\).)
  • Unitary Example 1: Hadamard Matrix \(H=\frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}\)
    • \(H^\dagger H = \left(\frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}\right) \left(\frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}\right) = \frac{1}{2}\begin{pmatrix} 1+1 & 1-1 \\ 1-1 & 1+1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \mathbf{1}\)
    • \(H\) is a unitary matrix. It maps the standard basis vectors \(|0\rangle=(1,0)^{\mathsf T}\) and \(|1\rangle=(0,1)^{\mathsf T}\) to the new orthonormal basis vectors \(|+\rangle=\frac{1}{\sqrt{2}}(1,1)^{\mathsf T}\) and \(|-\rangle=\frac{1}{\sqrt{2}}(1,-1)^{\mathsf T}\).
  • Projector Example 1: Projection onto a 1-dimensional subspace
    • The projection operator \(P=|v\rangle\langle v|\) onto the line generated by the unit vector \(v=\tfrac{1}{\sqrt{2}}(1,1)^{\mathsf T}\)
    • \(P = \begin{pmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{pmatrix} \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \end{pmatrix} = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}\)
    • \(P^2 = \frac{1}{4}\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} = \frac{1}{4}\begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix} = P\) (idempotency)
    • \(P^\dagger = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix} = P\) (self-adjointness)
    • Therefore, \(P\) is an orthogonal projector.
  • Spectral Theorem Example 1: Pauli X matrix \(A=\sigma_x=\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\)
    • Eigenvalues: \(\det(A-\lambda I) = \lambda^2-1=0 \implies \lambda_1=1, \lambda_2=-1\)
    • Normalized Eigenvectors: \(v_1=\frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}\) (for \(\lambda_1=1\)), \(v_2=\frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ -1 \end{pmatrix}\) (for \(\lambda_2=-1\))
    • Projectors: \(P_1=|v_1\rangle\langle v_1|=\frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}\), \(P_2=|v_2\rangle\langle v_2|=\frac{1}{2}\begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}\)
    • Spectral Decomposition: \(A=\lambda_1P_1 + \lambda_2P_2 = (1)\cdot P_1 + (-1)\cdot P_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\)
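
The \(L^2[0,1]\) and Fourier-basis examples above can also be reproduced numerically. A minimal sketch assuming NumPy; the integrals are approximated by midpoint Riemann sums, and the test function \(h(t)=t(2\pi-t)\) is an arbitrary choice used only to illustrate \(L^2\) convergence of the partial sums.

```python
import numpy as np

def l2_inner(u, v, dx):
    """Riemann-sum approximation of <u, v> = integral of conj(u) * v."""
    return np.sum(np.conj(u) * v) * dx

# L^2[0,1] example: f(t) = 1, g(t) = t  (midpoint grid for the quadrature)
N = 100_000
t = (np.arange(N) + 0.5) / N
f, g = np.ones_like(t), t
print(l2_inner(f, g, 1 / N))                 # ~0.5      = <f, g>
print(np.sqrt(l2_inner(g, g, 1 / N).real))   # ~0.57735  = ||g|| = 1/sqrt(3)

# Fourier basis of L^2[0, 2*pi]: partial sums converge to h in the L^2 norm.
s = (np.arange(N) + 0.5) * 2 * np.pi / N
ds = 2 * np.pi / N
h = s * (2 * np.pi - s)                      # sample function (arbitrary choice)
basis = lambda n: np.exp(1j * n * s) / np.sqrt(2 * np.pi)
for M in (1, 4, 16):
    coeffs = {n: l2_inner(basis(n), h, ds) for n in range(-M, M + 1)}
    partial = sum(c * basis(n) for n, c in coeffs.items())
    err = np.sqrt(l2_inner(h - partial, h - partial, ds).real)
    print(M, err)                            # the L^2 error shrinks as M grows
```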

4. Practice Problems and Solutions

Practice Problems

1. Inner Product and Norm: For \(x=(a,ib)^\mathsf{T}, y=(1,1)^\mathsf{T} \in\mathbb{C}^2\) with \(a,b\in\mathbb{R}\), compute \(\langle x,y \rangle\) and \(\|x\|\), and find the condition for \(x\perp y\).
2. Orthogonal Basis: Express the condition that \(|\psi\rangle=(\alpha,\beta)^{\mathsf T}\) is normalized (\(\|\psi\|=1\)) as an equation, and write \(|\psi\rangle\) as an expansion in the standard basis \(\{e_1, e_2\}\).
3. Hermitian: Determine whether \(A=\begin{pmatrix}0&i\\-i&0\end{pmatrix}\) and \(B=\begin{pmatrix}1&2\\2&1\end{pmatrix}\) are Hermitian operators, and explain why the eigenvalues of a Hermitian operator must be real.
4. Unitary: Show that an arbitrary unitary operator \(U\) preserves inner products and norms, i.e., \(\langle U\phi,U\psi \rangle = \langle \phi,\psi \rangle\) and \(\|U\psi\|=\|\psi\|\).
5. Projector: Let \(J\) be the 3×3 matrix all of whose entries are 1. Determine whether \(P=\frac{1}{3}J\) is an orthogonal projector (\(P^2=P\), \(P^\dagger=P\)).
6. Projector Comparison: Determine whether \(Q=\begin{pmatrix}1&1\\0&0\end{pmatrix}\) is a projector, and whether it is an orthogonal projector.
7. Spectral Decomposition: Find the eigenvalues and orthonormal eigenvectors of \(H=\begin{pmatrix}2&1\\1&2\end{pmatrix}\), and use them to write the spectral decomposition \(H=\sum_j \lambda_j P_j\).
8. Expectation Value: Given the state vector \(|\psi\rangle=\cos\theta|0\rangle+\sin\theta|1\rangle\) and the Hermitian operator \(H\) from problem 7, express the expectation value \(\langle \psi|H|\psi \rangle\) as a function of \(\theta\).
9. Trace Cyclic Property: Prove \(\mathrm{Tr}(ABC)=\mathrm{Tr}(BCA)=\mathrm{Tr}(CAB)\), and give a counterexample for a general (non-cyclic) permutation (e.g., \(ABC \to ACB\)).
10. Projection Theorem: State that for a closed subspace \(M\) of a Hilbert space and an arbitrary vector \(x\), the element of \(M\) closest to \(x\) is uniquely given by the orthogonal projection \(Px\) (\(\|x-Px\|=\inf_{m\in M} \|x-m\|\)), and interpret this intuitively in \(\mathbb{R}^2\) with a diagram.
11. Unitary and Basis: Show that if \(\{e_j\}\) is an orthonormal basis, then the set \(\{Ue_j\}\) obtained by applying a unitary operator \(U\) is also an orthonormal basis.
12. Projector Spectrum: Prove that the eigenvalues of an orthogonal projector \(P\) can only be 0 or 1.

Exercise Solutions

1. \(\langle x,y \rangle = \bar{a}\cdot 1 + \overline{(ib)}\cdot 1 = a-ib\) (for real \(a,b\)). \(\|x\|=\sqrt{\langle x,x \rangle} = \sqrt{|a|^2+|ib|^2} = \sqrt{a^2+b^2}\). \(x\perp y \iff \langle x,y \rangle=0 \iff a-ib=0\), which for real \(a, b\) forces \(a=b=0\), i.e., \(x=0\).
2. Normalization condition: \(\|\psi\|^2 = \langle\psi,\psi\rangle=|\alpha|^2+|\beta|^2=1\). Standard basis expansion: \(|\psi\rangle=\alpha e_1+\beta e_2\).
3. \(A^\dagger = \overline{A^{\mathsf T}} = \overline{\begin{pmatrix}0&-i\\i&0\end{pmatrix}} = \begin{pmatrix}0&i\\-i&0\end{pmatrix} = A\) and \(B^\dagger = \overline{B^{\mathsf T}} = B\), so both are Hermitian. Reason the eigenvalues are real: if \(A\) is Hermitian and \(Ax = \lambda x\) with \(x \neq 0\), then \(\lambda\langle x,x \rangle = \langle x,\lambda x \rangle = \langle x,Ax \rangle = \langle A^\dagger x,x \rangle = \langle Ax,x \rangle = \langle \lambda x,x \rangle = \bar{\lambda}\langle x,x \rangle\). Since \(\|x\| \neq 0\), \(\lambda = \bar{\lambda}\), i.e., \(\lambda\) is real.
4. \(\langle U\phi,U\psi \rangle = \langle \phi,U^\dagger(U\psi) \rangle = \langle \phi,(U^\dagger U)\psi \rangle = \langle \phi, \mathbf{1}\psi \rangle = \langle \phi,\psi \rangle\), and \(\|U\psi\|^2 = \langle U\psi,U\psi \rangle = \langle \psi,\psi \rangle = \|\psi\|^2 \implies \|U\psi\| = \|\psi\|\).
5. Since all entries of \(J\) are 1, \(J^2 = 3J\), so \(P^2 = (\frac{1}{3}J)^2 = \frac{1}{9}J^2 = \frac{1}{9}(3J) = \frac{1}{3}J = P\) (idempotency). Since \(J\) is a real symmetric matrix, \(J^\dagger = J\), and therefore \(P^\dagger = P\) (self-adjointness). Thus \(P\) is an orthogonal projector.
6. \(Q^2 = \begin{pmatrix}1&1\\0&0\end{pmatrix}\begin{pmatrix}1&1\\0&0\end{pmatrix}=\begin{pmatrix}1&1\\0&0\end{pmatrix}=Q\), so \(Q\) is idempotent and hence a projector. However, \(Q^\dagger = \begin{pmatrix}1&0\\1&0\end{pmatrix} \neq Q\), so it is not an orthogonal projector (it is an oblique projection).
7. Eigenvalues: \(\det(H-\lambda I)=(2-\lambda)^2-1=0 \implies \lambda=1,3\). Eigenvectors: \(\lambda_1=3 \to v_1=\frac{1}{\sqrt{2}}(1,1)^\mathsf{T}\), \(\lambda_2=1 \to v_2=\frac{1}{\sqrt{2}}(1,-1)^\mathsf{T}\). Projectors: \(P_1=|v_1\rangle\langle v_1|=\frac{1}{2}\begin{pmatrix}1&1\\1&1\end{pmatrix}\), \(P_2=|v_2\rangle\langle v_2|=\frac{1}{2}\begin{pmatrix}1&-1\\-1&1\end{pmatrix}\). Decomposition: \(H = 3P_1 + 1P_2\).
8. With \(|\psi\rangle=(\cos\theta,\sin\theta)^{\mathsf T}\), \(H|\psi\rangle=(2\cos\theta+\sin\theta,\ \cos\theta+2\sin\theta)^{\mathsf T}\), so \(\langle\psi|H|\psi\rangle = 2\cos^2\theta + 2\sin\theta\cos\theta + 2\sin^2\theta = 2+\sin 2\theta\). Equivalently, by the spectral decomposition of problem 7, \(\langle\psi|H|\psi\rangle = 3|\langle v_1|\psi\rangle|^2 + 1\cdot|\langle v_2|\psi\rangle|^2 = 3\cdot\frac{1+\sin 2\theta}{2} + \frac{1-\sin 2\theta}{2} = 2+\sin 2\theta\).
9. Writing the trace as an explicit sum, \(\mathrm{Tr}(ABC) = \sum_{i,j,k} A_{ij}B_{jk}C_{ki}\); relabeling the summation indices cyclically gives the same value for \(\mathrm{Tr}(BCA)\) and \(\mathrm{Tr}(CAB)\), so matrices may be permuted cyclically inside the trace. A non-cyclic permutation fails in general: for the Pauli matrices, \(\mathrm{Tr}(\sigma_x\sigma_y\sigma_z)=\mathrm{Tr}(i\mathbf{1})=2i\) while \(\mathrm{Tr}(\sigma_x\sigma_z\sigma_y)=\mathrm{Tr}(-i\mathbf{1})=-2i\).
10. Hilbert Projection Theorem: let \(M\) be a closed subspace (more generally, a closed convex set) of a Hilbert space \(H\). Then for any \(x \in H\) there exists a unique \(y \in M\) such that \(\|x - y\| = \inf_{z \in M} \|x - z\|\), and this \(y\) is the orthogonal projection \(Px\). In \(\mathbb{R}^2\), picture dropping a perpendicular from \(x\) onto the line \(M\): the foot of the perpendicular is the closest point, and \(x - Px\) is orthogonal to \(M\).
11. \(\langle Ue_i, Ue_j \rangle = \langle e_i, U^\dagger U e_j \rangle = \langle e_i, e_j \rangle = \delta_{ij}\), so \(\{Ue_j\}\) is orthonormal. It also spans the space: any vector \(x\) can be written as \(x = U(U^\dagger x) = \sum_j \langle e_j, U^\dagger x \rangle\, Ue_j\). Hence \(\{Ue_j\}\) is an orthonormal basis.
12. If \(Px = \lambda x\) with \(x \neq 0\), then \(P^2 = P\) gives \(\lambda^2 x = P^2 x = Px = \lambda x\), so \(\lambda^2 = \lambda\) and therefore \(\lambda = 0\) or \(\lambda = 1\).
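
As a numerical cross-check of solutions 7 and 8 (a minimal sketch assuming NumPy; the closed form \(2+\sin 2\theta\) is the one derived above):

```python
import numpy as np

H = np.array([[2.0, 1.0], [1.0, 2.0]])
vals, vecs = np.linalg.eigh(H)
print(vals)                                       # [1. 3.]

# Spectral decomposition H = sum_j lambda_j P_j (solution 7)
P = [np.outer(v, v.conj()) for v in vecs.T]
assert np.allclose(vals[0] * P[0] + vals[1] * P[1], H)

# Expectation value <psi|H|psi> = 2 + sin(2 theta) (solution 8)
for theta in np.linspace(0, np.pi, 7):
    psi = np.array([np.cos(theta), np.sin(theta)])
    assert np.isclose(psi @ H @ psi, 2 + np.sin(2 * theta))
```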

💡 Note: Example of Calculating an Expectation Value Using the Trace Operation

In quantum mechanics, the average value (expectation value) of a physical quantity measured for a specific state can be simply calculated using the density matrix and the observable operator.

  • State (Density Matrix): \(\rho = \begin{pmatrix} p & 0 \\ 0 & 1-p \end{pmatrix}\) (Meaning: A mixed state where the probability of being in state \(|0\rangle\) is \(p\), and the probability of being in state \(|1\rangle\) is \(1-p\))

  • Observable: \(\sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}\) (Meaning: A physical quantity that yields measurement values of +1 or -1)

At this point, the expectation value of \(\sigma_z\) is calculated as \(\mathrm{Tr}(\rho\sigma_z)\).

\(\langle\sigma_z\rangle = \mathrm{Tr}(\rho\sigma_z) = \mathrm{Tr}\left(\begin{pmatrix} p & 0 \\ 0 & 1-p \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}\right) = \mathrm{Tr}\begin{pmatrix} p & 0 \\ 0 & -(1-p) \end{pmatrix}\)

The trace is the sum of the diagonal components of a matrix, so

\(\langle\sigma_z\rangle = p + \bigl(-(1-p)\bigr) = p - 1 + p = 2p - 1\)

Physical Meaning of This Calculation

This result exactly matches the classical expectation value calculation.

Expectation Value = (Value 1) × (Probability of Value 1) + (Value 2) × (Probability of Value 2)

\(= (+1) \times p + (-1) \times (1-p)\)

\(= 2p - 1\)

In this way, the trace (\(\mathrm{Tr}\)) operation serves as a powerful and consistent mathematical tool: a seemingly complex matrix product reduces to the average value of a physical quantity of the system.
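
A numerical confirmation of this example (a minimal sketch assuming NumPy; \(p = 0.7\) is an arbitrary choice):

```python
import numpy as np

p = 0.7                                         # arbitrary probability of |0>
rho = np.array([[p, 0.0], [0.0, 1 - p]])        # density matrix of the mixed state
sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])   # observable

expectation = np.trace(rho @ sigma_z)
print(expectation, 2 * p - 1)                   # both print 0.4
assert np.isclose(expectation, 2 * p - 1)
```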